A Benchmark Study on Steepest Descent and Conjugate Gradient Methods-Line Search Conditions Combinations in Unconstrained Optimization
Authors
Abstract
In this paper, we computationally benchmark the performance of the steepest descent method and three well-known conjugate gradient methods (Fletcher-Reeves, Polak-Ribiere, and Hestenes-Stiefel) combined with six different step length calculation techniques/conditions, namely Backtracking, Armijo-Backtracking, Goldstein, weak Wolfe, strong Wolfe, and the exact local minimizer, in unconstrained optimization. To this end, a series of computational experiments on a test function set is completed using combinations of these optimization methods and line search conditions. During these experiments, the number of function evaluations at every iteration is monitored and recorded for all method-line search condition combinations. The total number of evaluations then serves as the performance measure whenever the combination in question converges to a function's minimum within a given convergence tolerance. From these data, data profiles are created for the purpose of a reliable and efficient benchmarking. It is determined that, on this test set, the steepest descent-Goldstein combination is the fastest one, whereas steepest descent-exact line search is the most robust at high accuracy. By making a trade-off between speed and robustness, we identify steepest descent-weak Wolfe as the optimal choice for the set.
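The paper itself does not include code; the following is a minimal Python sketch, assuming the standard textbook formulations, of two of the benchmarked combinations: steepest descent and Fletcher-Reeves conjugate gradient, each paired with a backtracking line search that enforces the Armijo sufficient-decrease condition. All function names and parameter defaults are illustrative.

```python
import numpy as np

def backtracking_armijo(f, grad_f, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink the step until the Armijo sufficient-decrease condition holds:
    f(x + alpha*d) <= f(x) + c * alpha * grad_f(x)^T d."""
    fx, slope = f(x), grad_f(x).dot(d)  # slope must be negative (descent direction)
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

def steepest_descent(f, grad_f, x0, tol=1e-6, max_iter=5000):
    """Steepest descent: the search direction is the negative gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        x = x + backtracking_armijo(f, grad_f, x, -g) * (-g)
    return x

def cg_fletcher_reeves(f, grad_f, x0, tol=1e-6, max_iter=5000):
    """Nonlinear conjugate gradient with the Fletcher-Reeves update
    beta_k = ||g_k||^2 / ||g_{k-1}||^2, restarting with -g when the
    new direction fails to be a descent direction."""
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking_armijo(f, grad_f, x, d)
        x = x + alpha * d
        g_new = grad_f(x)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if d.dot(g_new) >= 0:                # safeguard: restart if not descent
            d = -g_new
        g = g_new
    return x

# Example: minimize the ill-conditioned quadratic f(x) = x1^2 + 10*x2^2.
f = lambda x: x[0]**2 + 10.0 * x[1]**2
grad_f = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
x_sd = steepest_descent(f, grad_f, [3.0, -2.0])
x_cg = cg_fletcher_reeves(f, grad_f, [3.0, -2.0])
```

Swapping in the other line search conditions studied in the paper (Goldstein, weak/strong Wolfe, exact minimization) amounts to replacing `backtracking_armijo` with a routine that enforces the corresponding acceptance test, while counting the function evaluations each routine spends.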
Similar Resources
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
Conjugate gradient methods have attracted attention because they can be directly applied to large-scale unconstrained optimization problems. In order to incorporate second-order information of the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a de...
SVM-Optimization and Steepest-Descent Line Search
We consider (a subclass of) convex quadratic optimization problems and analyze decomposition algorithms that perform, at least approximately, steepest-descent exact line search. We show that these algorithms, when implemented properly, are within ε of optimality after O(log 1/ε) iterations for strictly convex cost functions, and after O(1/ε) iterations in the general case. Our analysis is gener...
Steepest Descent and Conjugate Gradient Methods with Variable Preconditioning
We analyze the conjugate gradient (CG) method with variable preconditioning for solving a linear system with a real symmetric positive definite (SPD) matrix of coefficients A. We assume that the preconditioner is SPD on each step, and that the condition number of the preconditioned system matrix is bounded above by a constant independent of the step number. We show that the CG method with varia...
A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization
In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line searches. We use a step-length technique that ensures the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve it and provide further analysis.
Journal
Journal title: Croatian Operational Research Review
Year: 2022
ISSN: 1848-0225, 1848-9931
DOI: https://doi.org/10.17535/crorr.2022.0006